Patent abstract:
Techniques and mechanisms for determining a lens system configuration. In one embodiment, respective distances from a reference are determined for each of a plurality of objects that are observable through the lens system. Based on the object distances, in-focus object counts are determined, each for a corresponding focal configuration of the lens system. Each such in-focus object count represents a total number of objects that are (or would be) in focus during the corresponding focal configuration, wherein a respective one of the plurality of objects is at a near depth of field of the corresponding focal configuration. In another embodiment, a preference of one focal configuration over another focal configuration is determined based on the counts of in-focus objects.
Publication number: BR112019009917B1
Application number: R112019009917-4
Filing date: 2017-07-20
Publication date: 2021-08-31
Inventors: Katrina Passarella;Vlad Cardei
Applicant: Google Llc;
IPC main classification:
Patent description:

BACKGROUND
1. Technical Field
[001] This disclosure relates generally to the field of optics and relates in particular, but not exclusively, to the operation of an image sensor with a variable focus lens system.
2. Background
[002] In optics, depth of field ("DOF") is the range of distances in a scene, between a nearer distance and a farther distance, within which objects in an image can appear acceptably sharp. A fixed focus lens can focus precisely at only a single depth within a scene, and sharpness gradually decreases on either side of that focus distance. Objects that fall within the depth of field are considered to have acceptable sharpness.
[003] Digital imaging devices, such as digital cameras, generally include a lens assembly that focuses image light onto an image sensor, which measures the image light and generates an image based on the measurements. A variable focus lens can adjust its focus distance so that it can be focused at different distances at different times. This allows the imaging device to shift its depth of field in order to focus on objects at any of a variety of distances. Conventional imaging devices often support autofocus functionality to facilitate such changes in focus distance. As the number and variety of form factors of imaging devices continue to grow over time, there is expected to be increasing demand for solutions that provide responsive and/or efficient autofocus functionality.
[004] Document JP 2014 235224 A relates to an imaging device comprising an imaging part that photographs a subject consecutively in time series and generates a plurality of images; a focus adjustment part that adjusts a focus state; a subject detection part that detects a plurality of main subjects from the images; and a control part that acquires at least one of a feature quantity of the plurality of detected main subjects or another feature quantity different from that feature quantity, and causes the focus adjustment part to adjust the focus state so that a maximum number of main subjects falls within the depth of field, based on the feature quantity and/or the other feature quantity.
[005] WO 2009/007860 A1 discloses a device that includes logic to capture an image, logic to detect a plurality of faces in the image, logic to calculate a distance associated with each face, logic to calculate a depth of field based on the distance associated with each face, and logic to calculate focus and exposure settings for capturing the image based on the depth of field associated with the plurality of faces.
[006] Document US 8,655,162 B2 describes setting a lens position based on focus scores. A plurality of starting positions of a lens is determined. Each of the starting positions can correspond to a lens position at which one of a plurality of objects has a highest quality. A focus score can be determined at each of the starting positions for the corresponding object with the highest quality. An end lens position between two of the starting positions can be calculated based on the focus scores.
BRIEF DESCRIPTION OF THE DRAWINGS
[007] The various embodiments of the present invention are illustrated by way of example, and not limitation, in the figures of the accompanying drawings, in which:
[008] FIG. 1 is a functional block diagram illustrating elements of a system for determining a focal configuration according to one embodiment.
[009] FIG. 2 is a flowchart illustrating elements of a method for operating an image sensing device according to an embodiment.
[0010] FIG. 3A is a plan view of an environment including a device for providing autofocus capability in accordance with one embodiment.
[0011] FIG. 3B is a graph illustrating processes performed to determine a focal configuration in accordance with one embodiment.
[0012] FIG. 4 is a flow diagram illustrating elements of a process for determining a focal configuration according to an embodiment.
[0013] FIG. 5 shows several views illustrating features of an environment in which an image sensing device is to provide autofocus capability according to one embodiment.
DETAILED DESCRIPTION
[0014] The embodiments described herein provide various techniques and mechanisms for determining a focus field to be provided with a lens system. Such a lens system may be configured, in some embodiments, based on an assessment of whether objects observable through the lens system would variously be in focus or out of focus, for example, given a particular aperture and/or another operational characteristic of an image sensor optically coupled to the lens system. Where it is determined that one lens system configuration results in (or would result in) more objects in focus than another lens system configuration, a signal may be generated, according to an embodiment, to indicate a relative preference of the one lens system configuration over the other lens system configuration.
[0015] In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the art will recognize, however, that the techniques described herein may be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, structures, materials or features are not shown or described in detail to avoid obscuring certain aspects. Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment. Furthermore, particular elements, structures or features may be combined in any suitable manner in one or more embodiments.
[0016] In some conventional digital imaging devices, if there is only one subject in a scene, an autofocus (AF) algorithm usually adjusts the lens position to set the focus distance on that subject. Some embodiments are based on the inventors' realization that, in some circumstances, this approach may not be ideal for scenes with multiple objects, such as human faces, located at various distances. Such embodiments improve on conventional autofocus techniques by providing mechanisms that recognize that one focus field, compared to another focus field, can result in a greater number and/or better arrangement of objects in focus.
[0017] As used herein, "field of view" refers to the portion of an environment that is observable through a lens system. A field of view can refer, for example, to that portion of a three-dimensional environment whose image can be captured as a two-dimensional image through a particular lens system directed at that portion. The term "focus field" (also "field of focus" or "focal field") refers to that portion of the field of view in which an object or objects, as viewed through the lens system, will be sufficiently in focus according to some predetermined criteria. A given focus field - which may depend in part on a given aperture of the imaging system, for example - comprises a respective focal distance and a respective depth of field.
[0018] A "focal distance" is a distance from some reference point (e.g., a center of a lens in the lens system) to the center of the focus field. A "depth of field" is a total depth of the field of focus (e.g., measured along a line of direction extending from/to the reference point). The depth of field, also known as the "focus range", extends between a near depth of field and a far depth of field. The term "near depth of field" refers to the distance to the nearer edge of the depth of field, measured from a reference point, such as a center of a lens in the lens system. Similarly, "far depth of field" refers to the distance from the reference point to the farther edge of the depth of field.
[0019] For many optical systems, the relationship between the focal distance (s) and the near depth of field (Dn) can, for example, be generally represented by the following equation:

Dn = (H · s) / (H + s)
where the hyperfocal distance (H) is the closest distance at which a lens system can be focused while keeping objects at infinity acceptably sharp. Normally, when the lens is focused at the hyperfocal distance, all objects at distances from half the hyperfocal distance to infinity will be acceptably sharp. The hyperfocal distance of an imaging system may be different for different configurations (eg, different apertures) of that imaging system.
[0020] The relationship between Dn, s and H can also be generally represented by the following equation:

s = (H · Dn) / (H - Dn)
and the relationship between the far depth of field (Df), s and H can be generally represented by the following equation:

Df = (H · s) / (H - s)

[0021] However, any of a variety of additional or alternative criteria, for example, including one or more equations adapted from conventional imaging techniques, can be used to identify relationships among Dn, Df, H and s.
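The relationships above can be exercised numerically. The following Python sketch is not part of the patent disclosure; the function name and sample values are illustrative assumptions, and the approximate equations from paragraphs [0019]-[0020] are used as-is:

```python
def depth_of_field_limits(s, H):
    """Near (Dn) and far (Df) depth of field for a focal distance s and a
    hyperfocal distance H, per the approximate relations above.

    When s >= H (focus at or beyond the hyperfocal distance), the far
    depth of field extends to infinity.
    """
    Dn = H * s / (H + s)
    Df = H * s / (H - s) if s < H else float("inf")
    return Dn, Df

# Focusing at the hyperfocal distance (here, a hypothetical H = 10 m):
# everything from H/2 to infinity is acceptably sharp, as noted above.
print(depth_of_field_limits(10.0, 10.0))  # (5.0, inf)
```

Note that the near depth of field at s = H comes out to exactly H/2, matching the statement in paragraph [0019] that objects from half the hyperfocal distance to infinity appear acceptably sharp.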
[0022] The phrase "focal configuration" here refers to a given configuration of a lens system, for example one of multiple possible configurations, which is to facilitate the provision of a corresponding focus field. The focus field can be that of the lens system alone. Alternatively, the focus field can be a general focus field provided by the lens system in combination with one or more other devices (eg including one or more lenses, a specific aperture structure, circuitry for running image focus software and/or the like).
[0023] The embodiments described herein determine, for example automatically, a relative preference of one focal configuration over another focal configuration, where such determination is based on the evaluation of an object or objects that may be in a focus field. Unless otherwise indicated, "identified object" or "identified objects" refers herein to one or more objects that are in a field of view of a lens system and that have been identified (e.g., including being distinguished from one another) as having a respective distance from some reference, such as a point located in or on a lens system. Such one or more objects may include only a subset of a larger plurality of objects observable through the lens system (e.g., where the subset includes only objects that occupy at least a minimal portion of the field of view).
[0024] As used herein, a "set of in-focus objects" refers to a set of those one or more identified objects that are, or would be, in focus as viewed with the lens system during a particular focal configuration thereof. Different focal configurations of the lens system can therefore correspond to different sets of in-focus objects. Of the one or more objects in a given set of in-focus objects, the object that is closest to the lens system may be referred to as the "closest in-focus object", whereas the object that is farthest from the lens system is referred to as the "farthest in-focus object". Thus, multiple sets of in-focus objects can comprise different closest in-focus objects and/or different farthest in-focus objects. An "in-focus object count" (for brevity, also referred to herein as an "object count") refers herein to the total number of the one or more objects in a set of in-focus objects.
[0025] FIG. 1 illustrates elements of a system 100, according to one embodiment, for determining a focal configuration to be implemented for use in an image capture operation. System 100 is just one example of an embodiment configured to determine, based on respective distances to objects that are within a field of view, a preference of one focal configuration over another focal configuration. Such a preference can be determined, for example, based on scores calculated for each of the focal configurations. The scores can be equal to, or otherwise based on, a respective in-focus object count associated with a corresponding focus field.
[0026] In the illustrative embodiment shown, system 100 includes a lens system 110 comprising one or more lenses, such as illustrative lens 112 shown, for receiving light 105 from an external environment. Lens system 110 can include any of a variety of optical devices that accommodate adjustable focusing capability. Such an optical device can be controlled, based on the techniques described herein, using one or more focus adjustment mechanisms adapted from conventional autofocus technology.
[0027] The lens system 110 can be optically coupled to direct light 105 from the external environment towards an image sensor 120 of the system 100, for example, where the light output by the lens system 110 is focused through an aperture 122 onto a pixel array 124. The pixel array 124 may include complementary metal-oxide-semiconductor (CMOS) pixels and/or any of a variety of other pixels adapted from conventional image detection techniques. Some embodiments are not limited to a particular pixel array architecture for use in generating image data based on light 105. In some embodiments, a configuration of lens system 110 is to be determined given a particular size of aperture 122, for example, where image sensor 120 is a fixed-aperture device or where a focal configuration is to be selected from a plurality of possible focal configurations for use in combination with a particular size of aperture 122.
[0028] For example, system 100 may further comprise a distance sensor 140 configured to operate as a rangefinder to detect objects in a field of view that is observable with the lens system 110. Detecting object distances with the distance sensor 140 may include one or more operations adapted from conventional rangefinding techniques, which are not detailed herein to avoid obscuring features of various embodiments. By way of illustration and not limitation, the distance sensor 140 may provide the functionality of a laser rangefinder, an ultrasonic rangefinder, or an infrared rangefinder. Other means of range detection are possible, such as light detection and ranging (LIDAR), radio detection and ranging (RADAR), microwave ranging, etc.
[0029] The distance sensor 140 can be coupled to output signals 142 to distance evaluation circuitry 150 of the system 100. The distance evaluation circuitry 150 can comprise logic, for example, including an application-specific integrated circuit (ASIC), processor circuitry, a state machine and/or other semiconductor hardware, configured to detect, based on signals 142, that multiple objects are in a field of view observable through lens system 110. Some or all of these objects may be distinguishable from each other by their different respective distances from system 100.
[0030] Rangefinding with the distance sensor 140 and the distance evaluation circuitry 150 may include active sensing techniques, passive sensing techniques (for example, including phase detection, contrast measurement and/or the like) or a combination thereof. In one embodiment, the distance evaluation circuitry 150 may identify, for each object of a plurality of objects, a respective distance to that object relative to some reference location in or on system 100. This identification may be based, for example, on a threshold response to a laser and/or other rangefinding signal output from the distance sensor 140. Such a lower-threshold response can limit the plurality of identified objects to those objects that occupy at least some lower-limit portion of the field of view. Alternatively or additionally, such a lower-threshold response may limit the plurality of identified objects to those objects in the field of view that are within some upper-limit distance from system 100.
[0031] Although some embodiments are not limited in this regard, the distance sensor 140 can provide directional rangefinding functionality that identifies, for different respective portions of the field of view, a distance to a respective object at least part of which occupies that portion of the field of view. For example, distance sensor 140 can be operated to sequentially (or otherwise) scan the field of view, where corresponding response signals, received by distance sensor 140 in sequence, are each associated with a different respective portion of the field of view. In such an embodiment, the distance evaluation circuitry 150 may associate each of multiple object distances with a respective different portion of the field of view.
[0032] The system 100 may further comprise selection circuitry 160 coupled to receive, from the distance evaluation circuitry 150, information that specifies or otherwise indicates the respective distances of the plurality of objects from the system 100. The selection circuitry 160 may comprise logic, for example, including an ASIC, processor circuitry and/or the like, to determine, based on such object distances, a focal configuration to implement with lens system 110. Such determination may include selection circuitry 160 identifying a relative preference of some first focal configuration over a second focal configuration. This preference can be based, for example, on determining that the first focal configuration, compared to the second focal configuration, would result in a greater number and/or better arrangement (as indicated by some score or other metric) of objects in focus.
[0033] In an illustrative embodiment, selection circuitry 160 includes or accesses reference information describing one or more relationships between a focal distance, near depth of field (Dn), far depth of field (Df), hyperfocal distance (H) and/or any of several other optical characteristics to be provided with lens system 110. By way of illustration and not limitation, selection circuitry 160 may comprise or be coupled to a memory 130 that is preprogrammed with this reference information, for example, by a manufacturer, retailer, computer network service or other agent. Based on object distance data from distance evaluation circuitry 150, selection circuitry 160 can access the reference information in memory 130 to select, calculate and/or otherwise determine, for a given distance to an object, a total number of objects that are (or would be) in focus during a corresponding focal configuration of the lens system 110. For example, the focal configuration may correspond to the object in question being located at a near depth of field to be provided with the lens system 110.
[0034] Based on an evaluation of multiple possible focal configurations, selection circuitry 160 may output a signal 162 that identifies or otherwise indicates a focal configuration that has been determined to be preferred over at least one alternative focal configuration. In response to signal 162, a focus controller (FC) 170 of system 100 may adjust or otherwise configure a focus field to be implemented with lens system 110. FC 170 may include any of a variety of one or more hardware and/or software mechanisms for changing an effective focal distance provided with lens system 110. By way of illustration and not limitation, FC 170 may include a motor for moving lenses of lens system 110 relative to one another and/or to pixel array 124. Alternatively or additionally, FC 170 may include logic (e.g., including an ASIC, a processor, executing software and/or the like) that, for example, is to implement a focus field at least in part by image processing calculations. Based on signal 162, FC 170 can implement a focal configuration that provides a particular depth of field with lens system 110. During such a focal configuration, image sensor 120 can be operated, for example, responsive to selection circuitry 160 and/or FC 170, to capture an image of the external environment.
[0035] Although some embodiments are not limited in this regard, the distance evaluation circuitry 150 may further include or be coupled to image recognition circuitry (not shown) that is configured to receive and process image information generated by pixel array 124 based on light received via lens system 110. Such image recognition circuitry may, for example, be preprogrammed with (or have access to) other reference information describing one or more classes of objects. Based on this other reference information, the image recognition circuitry can evaluate the signals from pixel array 124 to determine whether any region of the field of view includes a representation of some object belonging to a predefined object class. Some examples of object classes include, but are not limited to, an eye class, a mouth class, a head class, an automobile class, a building class and/or the like. The identification of one or more objects of an object class can include operations adapted from conventional image recognition techniques. In one embodiment, one or more of the object distances variously indicated by signals 142 may each be associated with a respective object that is identified as belonging to a corresponding object class.
[0036] In summary, the device (system 100) shown in FIG. 1 may comprise a combination of at least some of the following elements: a lens system 110 for receiving light from an environment external to the device; distance evaluation circuitry 150 configured to identify respective distances for each of a plurality of objects including at least a first object and a second object, which distance evaluation circuitry 150 can be coupled to a distance sensor 140 for processing output signals 142 from distance sensor 140 and for determining the distance to a respective object based on an evaluation of signals 142; selection circuitry 160 coupled to distance evaluation circuitry 150 to determine a focus of lens system 110, selection circuitry 160 including logic that, when executed, causes the device (system 100) to perform operations including: determining a first count of any of the plurality of objects in focus while the first object is at a first near depth of field Dn, due to a first focal configuration (e.g., a certain configuration of lens system 110) and to a first aperture 122 (e.g., a size of aperture 122, which focuses light onto a pixel array 124 of an image sensor 120 optically coupled to lens system 110 to capture an image received with lens system 110); determining a second count of any of the plurality of objects in focus while the second object is at a second near depth of field Dn, due to a second focal configuration and to the first aperture; comparing a first score based on the first count and a second score based on the second count; and providing, based on the comparison, a signal 162 that indicates a preference between the first focal configuration and the second focal configuration.
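The first-count/second-count comparison summarized above can be sketched in Python. This is a hypothetical illustration, not part of the patent disclosure; the function name, the hyperfocal-distance model (taken from the approximate equations of paragraphs [0019]-[0020]) and the sample values are assumptions:

```python
def preference_signal(distances, d1, d2, H):
    """Compare two focal configurations of a lens system: one placing an
    object at near depth of field d1, the other at d2, with the aperture
    (and hence the hyperfocal distance H) held fixed.

    Returns 1 if the first configuration is preferred, 2 otherwise,
    using unweighted in-focus object counts as the scores.
    """
    def count(dn):
        # Far depth of field for a configuration whose near depth of
        # field is dn: Df = H*dn/(H - 2*dn), infinite when dn >= H/2.
        df = H * dn / (H - 2 * dn) if dn < H / 2 else float("inf")
        return sum(1 for x in distances if dn <= x <= df)
    return 1 if count(d1) >= count(d2) else 2
```

With hypothetical object distances of 1.0 m to 9.0 m and H = 10 m, a configuration whose near depth of field sits at 2.0 m captures more objects than one whose near depth of field sits at 1.0 m, so the signal indicates a preference for the former.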
[0037] Here, the first score or the second score can be based on a value of the first count or a value of the second count, respectively. For example, a respective score can be derived as a weighted score in which individual objects included in the count are given different weights. If such weights are omitted, the first score and the second score may be equal to the first count and the second count, respectively. Otherwise, a score may be based, at least in part, on a weighted value that has been assigned to an object in the field of view.
[0038] In addition, the device may comprise a focus controller 170 coupled to adjust the lens system 110 based on the signal 162.
[0039] The operations that can be performed by the device (system 100) also define a method for providing an autofocus capability based on object distance information, which is described in more detail below for an embodiment and which may be implemented by executing corresponding instructions stored on a non-transitory computer-readable storage medium.
[0040] FIG. 2 illustrates elements of a method 200 for determining a lens system focal configuration in accordance with one embodiment. To illustrate certain features of various embodiments, method 200 is described herein with reference to an example scenario illustrated in FIGs. 3A, 3B. FIG. 3A shows a top side view of an environment 300 in which an image sensor device 310 is to operate according to one embodiment. FIG. 3B shows a view 350 of various distances, as projected onto a single line 360, from image sensor device 310 to respective objects in environment 300. Method 200 can be performed with one or more components of image sensor device 310, for example, where image sensor device 310 includes some or all of the features of system 100. However, other embodiments include method 200 being performed by any of a variety of other image sensor devices having the features described herein.
[0041] In one embodiment, method 200 includes, at 210, identifying respective distances for each of a plurality of objects in a field of view of a lens system. For example, as shown in FIG. 3A, the image sensor device 310 can be positioned (located and oriented) such that multiple objects, such as the six illustrative objects A to F shown, are each in the field of view 320 (e.g., between sight lines 322, 324) that is observable through a lens system 312 of the image sensor device 310. Objects A to F are illustrative and not limiting with respect to some embodiments, and the image sensor device 310 can be configured to perform method 200 based on more, fewer, and/or differently arranged objects.
[0042] Positions of objects in the field of view 320 can be identified at least in part with reference, for example, to a polar coordinate system (for example, part of a cylindrical or spherical coordinate system) comprising a distance dimension x and an angular dimension θ. In the illustrative scenario shown, field of view 320 includes object A at location (x1, θ4), object B at location (x2, θ3), object C at location (x3, θ2), object D at location (x4, θ5), object E at location (x5, θ6) and object F at location (x6, θ1). In the example scenario shown in FIG. 3A, objects A to F are each in a two-dimensional plane. However, it will be appreciated that some or all of these objects may be located at different vertical heights in a three-dimensional space, for example where the vertical height component of an object's location may contribute some additional distance between the object and the image sensor device 310. The identification at 210 may include, for example, identifying distances x1, x2, x3, x4, x5, x6, for example, where such identification is performed with distance evaluation circuitry 150 based on signals 142.
[0043] The method 200 may further comprise performing a first determination of a focal configuration to be implemented (with the lens system 312, in the example of FIG. 3A). Such a determination, also referred to herein as a "first focus determination" for brevity, may determine a lens system focus, for example, including operations that provide a comparative evaluation of at least two focal configurations based on respective in-focus object counts. For example, the first focus determination may include, at 220, determining a first count of any of the plurality of objects in focus while a first object of the plurality of objects is at a first near depth of field, due to a first focal configuration of the lens system and to a first aperture (i.e., a particular aperture size that can be fixed or, alternatively, adjustable). The first count can represent a first total number of any of the plurality of objects that would appear in focus if viewed with the lens system while the first focus field is implemented with the first focal configuration and the first aperture.
[0044] The first focus determination may further comprise, at 230, determining a second count of any of the plurality of objects in focus while a second object of the plurality of objects is at a second near depth of field, due to a second focal configuration and the first aperture. The second count can represent a second total number of any of the plurality of objects that would appear in focus if viewed with the lens system while the second focus field is implemented with a second lens system focal configuration and the first aperture. The determination at 230 may include counting a total number of objects of a second set of in-focus objects that corresponds to the second focal configuration.
[0045] FIG. 3B illustrates an example of a focal configuration determination (such as the determinations at 220 and 230) comprising counting, for each of a plurality of focal configurations, a respective count of objects of a set of in-focus objects that corresponds to that focal configuration. Such counting (referred to herein as an "in-focus object count") may include setting a near depth-of-field value equal to the distance to a particular object, and calculating or otherwise determining the far depth-of-field value that corresponds to (e.g., is to be concurrent with) that near depth-of-field value. The distances identified at 210 can then be evaluated to determine which objects are between the near depth of field and the corresponding far depth of field.
[0046] For example, as shown in view 350, a focal configuration determination can perform a first in-focus object count for a focus field D1, where the near depth of field of D1 is to be at the same distance (x1-x0) from the image sensor device 310 as object A. The first in-focus object count can determine that only one of objects A to F, i.e., object A, is (or would be) in focus when lens system 312 has a first focal configuration to facilitate D1. A second in-focus object count can be performed for a focus field D2, where the near depth of field of D2 is to be at the same distance (x2-x0) from the image sensor device 310 as object B. The second in-focus object count can determine that a total of three of the objects, i.e., objects B, C and D, are or would be in focus when lens system 312 has a second focal configuration to facilitate D2.
[0047] The focal configuration determination can further perform a third in-focus object count for a focus field D3, where the near depth of field of D3 is to be at the same distance (x3-x0) from the image sensor device 310 as object C. The third in-focus object count can determine that a total of two of the objects, i.e., objects C and D, are or would be in focus when lens system 312 has a third focal configuration to facilitate D3. A fourth in-focus object count can be performed for a focus field D4, where the near depth of field of D4 is to be at the same distance (x4-x0) from the image sensor device 310 as object D. The fourth in-focus object count can determine that a total of two of the objects, i.e., objects D and E, are or would be in focus when lens system 312 has a fourth focal configuration to facilitate D4.
[0048] The focal configuration determination can further perform a fifth in-focus object count for a focus field D5, where the near depth of field of D5 is to be at the same distance (x5-x0) from the image sensor device 310 as object E. The fifth in-focus object count can determine that only one object, i.e., object E, is or would be in focus when lens system 312 has a fifth focal configuration to facilitate D5. A sixth in-focus object count can be performed for a focus field D6, where the near depth of field of D6 is to be at the same distance (x6-x0) from the image sensor device 310 as object F. The sixth in-focus object count can determine that only one object, i.e., object F, is or would be in focus when lens system 312 has a sixth focal configuration to facilitate D6. The respective depths of field of D1-D6 can be substantially equal, for example, within 10% of each other and, in some embodiments, within 5% of each other.
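The counting procedure walked through for FIGs. 3A and 3B can be sketched in Python as follows. This sketch is not part of the patent disclosure; the function names, the hyperfocal distance H and the object distances are illustrative assumptions, chosen so that, as in the example above, the second configuration yields the largest count:

```python
def far_limit_for_near(Dn, H):
    # From Dn = H*s/(H+s) and Df = H*s/(H-s), a configuration whose near
    # depth of field equals Dn has far depth of field Df = H*Dn/(H - 2*Dn),
    # or infinity when Dn >= H/2.
    return H * Dn / (H - 2 * Dn) if Dn < H / 2 else float("inf")

def in_focus_counts(distances, H):
    # One candidate focal configuration per identified object: set the near
    # depth of field to that object's distance, then count the objects that
    # fall between the near and far depths of field.
    return {d: sum(1 for x in distances
                   if d <= x <= far_limit_for_near(d, H))
            for d in distances}

# Hypothetical distances (meters) for objects A..F, hyperfocal H = 10 m.
counts = in_focus_counts([1.0, 2.0, 2.4, 3.0, 6.0, 9.0], 10.0)
preferred = max(counts, key=counts.get)  # near depth of field of 2.0 m
```

Here the configuration whose near depth of field sits at the second object's distance captures three objects (analogous to objects B, C and D in focus field D2), so it would be preferred over the other five candidates.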
[0049] The focal configuration determination performed by method 200 may further comprise, at 240, performing a comparison of a first score based on the first count determined at 220 and a second score based on the second count determined at 230. For example, the first score and the second score can be equal to the first count and the second count, respectively. In another embodiment, a score may be based, at least in part, on a weight value that has been assigned to an object in the field of view. The assignment of such a weight value can be based, for example, on a location of the object in question within the field of view. By way of illustration and not limitation, a weight value can be assigned to an object based, at least in part, on the object's location relative to a reference point or a reference line (e.g., a center, a median line, an edge and/or a corner) of the field of view.
[0050] Alternatively or additionally, such a weight value may be assigned to the object based, at least in part, on its position in the field of view relative to one or more other objects that are also in the field of view. Such a weight value may additionally or alternatively be assigned based, at least in part, on an object class type that has been identified, by image recognition processing, as corresponding to the object.
[0051] Based on a result of the comparison performed at 240, method 200 may further comprise, at 250, providing a signal indicating a preference between the first focal configuration and the second focal configuration. For example, the signal may specify or otherwise indicate that the first focal configuration is to be preferred over the second focal configuration. The signal provided at 250 may indicate that the first focus field, to be provided by the first focal configuration, will result in a greater number of in-focus objects, and/or a better weighted score for in-focus objects, as compared to a second focus field that would be provided by the second focal configuration.
[0052] Referring again to the example scenario shown in FIG. 3B, the signal provided at 250 may indicate a preference for the focal configuration that facilitates D2 over the focal configuration that facilitates D1. Such a signal can identify the focal configuration that, of a plurality of such configurations, will result in the greatest number of in-focus objects. In some embodiments, the signal provided at 250 may be generated independently of any count of objects that would be in focus while each of the plurality of objects is offset from a near depth of field, e.g., independently of any determination of an in-focus object count that corresponds to a focal configuration other than one that is to place one of the plurality of objects at a near depth of field. For example, a focal configuration determination performed for objects A through F in field of view 320 may include performing only six in-focus object counts, i.e., one for each respective one of the focus fields D1 through D6 shown.
[0053] Conventional techniques for determining a lens focus variously scan through a range of focal lengths, performing respective calculations for each of a comparatively large set of focus fields. This larger set typically includes many focus fields for which no identified object is (or would be) located at the near depth of field. By contrast, some embodiments calculate scores for a relatively smaller and more particular set of focus fields, for example, the total number of which may be no more than a total number of the plurality of objects. The comparative evaluation of only D1 through D6, for example, without also evaluating the many other focus fields intermediate between respective ones of D1 through D6, illustrates an efficiency obtained by many such embodiments. Compared to conventional techniques, such embodiments are more efficient, providing relatively simpler and thus faster processing for evaluating focus fields.
[0054] Although some embodiments are not limited in this regard, method 200 may comprise one or more additional operations (not shown) to operate an image sensor device based on the signal provided at 250. For example, method 200 may further comprise configuring the lens system based on the signal provided at 250, e.g., wherein the lens system implements the first focal configuration to locate the first object at a near depth of field. In such an embodiment, an object other than the first object may be the object (of the entire plurality of objects) that is closest to the lens system. After configuring the lens system, method 200 can further operate a pixel array to capture an image received via the lens system. While some embodiments are not limited in this regard, method 200 may be repeated one or more times, for example with selection circuitry 160 performing one or more additional focus determinations of a plurality of focus determinations that includes the first focus determination. For example, some or all of the plurality of focus determinations may each correspond to a respective different aperture with which the lens system is to operate.
[0055] FIG. 4 illustrates elements of a method 400 for determining a focal configuration according to one embodiment. Method 400 can be performed with system 100 or with image sensor device 310, for example. In one embodiment, method 400 includes some or all of the features of method 200.
[0056] In the illustrative embodiment shown, method 400 includes operations to initialize variables used in determining a preferred focal configuration. By way of illustration and not limitation, such operations may include, at 405, setting to zero each of a variable Dmax, representing a currently preferred near focus field, and another variable Nmax, representing an in-focus object count corresponding to Dmax. The operations at 405 may additionally or alternatively include setting a counter variable x to an initial value, for example, one (1).
[0057] Method 400 may further comprise, at 410, determining a distance dx of the xth object (where "xth" is an ordinal corresponding to a current value of the variable x) of a plurality of objects that have been detected within a field of view observable through a lens system. Distance dx can be determined relative to a reference location, such as a center point in or on a lens of the lens system. At 415, method 400 can determine a value Nx representing an in-focus object count, that is, a count of objects that are (or would be) in focus while the lens system has a focal configuration that places the xth object at a near focus field. The determination at 415 may include one or more operations such as the determination at 220 or the determination at 230, for example.
[0058] Method 400 may further include determining, at 420, whether the value Nx most recently determined at 415 is greater than a current value of Nmax. Where it is determined at 420 that Nx is greater than the current value of Nmax, method 400 can perform operations, at 425, including setting Nmax equal to the most recently determined value of Nx. The operations at 425 may further comprise setting Dmax equal to the most recently determined value of dx. Subsequently, a determination can be made, at 430, as to whether any other object of the plurality of objects remains to be addressed by method 400. Where it is instead determined at 420 that Nx is less than (or, in some embodiments, equal to) the current value of Nmax, method 400 can forgo an instance of the operations at 425 and proceed to the determination at 430.
[0059] In response to a determination at 430 that each of the plurality of objects has been addressed, method 400 may proceed to, or be followed by, subsequent operations (not shown) to implement in the lens system a focal configuration that provides a near depth of field equal to the most recent value of Dmax. Where it is instead determined at 430 that at least one of the plurality of objects has not been addressed, method 400 may increment the counter x at 435 and proceed to execute (for the newly incremented value of x) another instance of the determination at 410.
[0060] In one embodiment, the initial xth object, that is, the first object of the plurality of objects to be addressed by method 400, is the object of the plurality that is closest to the lens system. Each next xth object to be addressed by method 400 can, for example, be the next-farthest object from the lens system. In such an embodiment, one or more additional test conditions (not shown) can be evaluated to determine whether an early exit from method 400 should be performed.
[0061] By way of illustration and not limitation, an early exit from method 400 may be performed in response to a determination, for example at 420, that Nx counts the xth object and all other objects (of the plurality of objects) that are farther from the lens system than the xth object. Alternatively or additionally, an early exit from method 400 may be performed in response to a determination, for example at 420, that no subsequent evaluation of Nx could be greater than the current value of Nmax (e.g., where objects are addressed successively in ascending order of their respective distances from the lens system).
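The flow of method 400, including the Nmax/Dmax bookkeeping and the nearest-first early exit of paragraph [0061], might be sketched as follows (illustrative Python; `count_in_focus` is a hypothetical stand-in for the determination at 415, and the distances are invented):

```python
def select_near_focus(distances, count_in_focus):
    """Track the currently preferred near focus field (d_max) and its
    in-focus object count (n_max), addressing objects nearest-first."""
    d_max, n_max = 0.0, 0          # initialization as at 405
    ordered = sorted(distances)    # nearest object first
    for i, dx in enumerate(ordered):
        nx = count_in_focus(dx)    # in-focus count as at 415
        if nx > n_max:             # determination at 420, operations at 425
            d_max, n_max = dx, nx
        # Early exit: a later configuration can cover at most the objects
        # farther than the current one, so it cannot beat n_max.
        if len(ordered) - i - 1 <= n_max:
            break
    return d_max, n_max

distances = [1.0, 2.0, 2.3, 2.6, 3.2, 5.0]
in_focus = lambda near: sum(1 for d in distances if near <= d <= near + 0.8)
best_near, best_count = select_near_focus(distances, in_focus)
```

With these invented numbers the loop stops after the third object, since at most three farther objects remain and three are already counted in focus at the second candidate field.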
[0062] FIG. 5 illustrates features of an embodiment wherein a focal configuration of a lens system is determined based on respective scores for objects in a field of view, where the scores are in turn determined based on different weight values assigned to various ones of the objects. Such a determination can be performed, for example, by system 100 or image sensor device 310, e.g., according to one of methods 200, 400.
[0063] FIG. 5 shows a top view of an environment 500 in which an image sensor device 510 is to operate in accordance with one embodiment. As shown in FIG. 5, the image sensor device 510 may be positioned (located and oriented) so that objects, e.g., the illustrative plurality of objects A through F shown, are each in a field of view 520, between lines of sight 522, 524, that is observable through a lens system 512 of the image sensor device 510. The image sensor device 510 can be positioned to additionally or alternatively image more objects, fewer objects and/or differently arranged objects, in various embodiments.
[0064] The inset of FIG. 5 shows an example view 550 of objects A through F (people, in the illustrative scenario) that are within field of view 520, view 550 being as seen through lens system 512. A focal configuration for lens system 512 can be determined, for example, based in part on the respective distances x1 through x6 of objects A through F from lens system 512. In some embodiments, such a focal configuration determination can additionally be based on the respective locations of some or all of objects A through F in view 550.
[0065] For example, preprogrammed reference information, e.g., stored in memory 130, may correspond to different regions of view 550, each with a respective value indicating a degree of preference placed on objects in that region. In the illustrative embodiment shown, these regions include a region 554 in which a center of view 550 (the center being aligned with centerline 526 of field of view 520) is located. The regions may further include a region 556 adjacent to and extending around region 554, as well as another region 558 adjacent to and extending around region 556. Yet another region, around region 558, may extend to an edge 552 of view 550.
[0066] For a given one of these regions, an object identified as being in that region may be assigned a weight that is equal to, or otherwise based on, the predefined preference value associated with that region. In the illustrative embodiment shown, object A may be assigned a first weight corresponding to region 554, and each of objects B, C and D may be assigned a second weight corresponding to region 558. Objects E and F may each be assigned a third weight corresponding to the region adjoining and surrounding region 558.
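The region-to-weight assignment of paragraphs [0065] and [0066] might be modeled as a simple lookup (hypothetical Python; the radial region boundaries and weight values are illustrative assumptions, not values from the embodiment):

```python
def region_weight(obj_xy, center_xy, region_radii, region_weights):
    """Return the weight of the innermost region containing the object,
    measured by distance from the view center; objects outside all listed
    radii get the weight of the outermost region (up to the view edge)."""
    dist = ((obj_xy[0] - center_xy[0]) ** 2 +
            (obj_xy[1] - center_xy[1]) ** 2) ** 0.5
    for radius, weight in zip(region_radii, region_weights):
        if dist <= radius:
            return weight
    return region_weights[-1]

# Three nested regions akin to 554, 556/558 and the surrounding area:
center = (0.5, 0.5)
radii = (0.1, 0.3)       # boundaries of the two inner regions
weights = (3, 2, 1)      # center-most region weighted most heavily
w_center = region_weight((0.55, 0.50), center, radii, weights)  # near center
w_outer = region_weight((0.95, 0.50), center, radii, weights)   # near edge
```

The same lookup shape would apply to rectangular nested regions; only the containment test changes.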
[0067] A score Sx can be calculated for a given focal configuration Cx of lens system 512, for example, where Cx provides a near depth of field that is equal to the distance from lens system 512 of the xth object of the plurality of objects. In one embodiment, an Sx value can be calculated according to the following:

Sx = Σ (i = 1 to I) Bix · Wi (6)

where I is an integer equal to a total number of the plurality of objects, Bix is a Boolean value that is equal to "1" if the ith object is (or would be) in focus during Cx (and equal to "0" otherwise), and Wi is a weight value associated with the region of view 550 in which the ith object is located. In some other embodiments, a given weight value Wi may additionally or alternatively be determined based on an object type to which the corresponding ith object belongs. By way of illustration and not limitation, a weight value Wi may be relatively more significant where image recognition processing has identified the corresponding ith object as being an instance of a human-face object type (or a part thereof). The assignment of particular weights to respective objects can be based on any of a wide variety of possible object-type preferences that are preprogrammed or otherwise predetermined, for example, by a manufacturer, retailer, user or other agent. In one illustrative embodiment, a relatively more significant (e.g., greater) weight value can be assigned to objects of a human-face object type, as compared to one or more other object types. However, the techniques by which such preferences are determined may depend on implementation-specific details, and may not be limiting on some embodiments. In contrast to an in-focus object count, an Sx value (or a Wi value) can be a number other than an integer, for example. Equation (6) is merely one example of a calculation to determine Sx for a given xth object; any of a variety of other calculations to determine a value of Sx can be performed, according to different embodiments.
[0068] Some embodiments may calculate respective Sx values for two or more of, for example, each of, the objects identified as located in the field of view. Lens system 512 may subsequently be configured based on an evaluation of such scores. For example, a focal configuration of lens system 512 can be implemented based on that focal configuration having the largest Sx value. In some scenarios, two or more focal configurations may have the same Sx value, for example, where that Sx value is greater than all other calculated Sx values. In such an embodiment, one of the two or more focal configurations may be selected for implementation based on that focal configuration having a shorter focal length as compared to the others of the two or more focal configurations.
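Equation (6) and the subsequent selection described in paragraph [0068] can be sketched together (hypothetical Python; the in-focus flags, weights and focal lengths below are invented for illustration):

```python
def weighted_score(in_focus_flags, obj_weights):
    """Equation (6): Sx is the sum over all I objects of Bix * Wi, where Bix
    is 1 if the i-th object is in focus under configuration Cx, else 0."""
    return sum(b * w for b, w in zip(in_focus_flags, obj_weights))

def select_configuration(candidates):
    """Prefer the largest Sx; break ties in favor of the shorter focal
    length, as described for same-score configurations."""
    return max(candidates, key=lambda c: (c["score"], -c["focal_length"]))

obj_weights = [3.0, 2.0, 2.0]  # e.g., region- or object-type-based weights
candidates = [
    {"focal_length": 1.0, "score": weighted_score([1, 0, 0], obj_weights)},
    {"focal_length": 2.0, "score": weighted_score([0, 1, 1], obj_weights)},
]
best = select_configuration(candidates)
```

Note that, consistent with the paragraph above, a score can be non-integer even though each in-focus count is an integer.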
[0069] Techniques and architectures for operating an optical device are described herein. Some portions of the detailed description presented here are given in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the computing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like.
[0070] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities. Unless specifically stated otherwise as apparent from the discussion herein, it is appreciated that throughout the description, discussions using terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0071] Certain embodiments also relate to apparatus for performing the operations described herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs) such as dynamic RAM (DRAM), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, coupled to a computer system bus.
[0072] The algorithms and displays presented here are not inherently related to any particular computer or other device. Various general purpose systems can be used with programs in accordance with the teachings herein, or it may be convenient to build more specialized apparatus to carry out the required method steps. The structure needed for a variety of these systems will appear from the description here. Furthermore, certain embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages can be used to implement the teachings of such embodiments as described herein.
[0073] In addition to what is described herein, various modifications may be made to the disclosed embodiments and implementations thereof without departing from their scope. Therefore, the illustrations and examples herein should be construed in an illustrative, and not a restrictive, sense. The scope of the invention should be measured solely by reference to the claims that follow.
Claims (22)
[0001]
1. Device (100), characterized in that it comprises: a lens system (110) for receiving light (105) from an environment external to the device; distance evaluation circuitry (150) configured to identify respective distances for each of a plurality of objects including a first object and a second object; selection circuitry (160) coupled to the distance evaluation circuitry (150) to determine a focus of the lens system (110), the selection circuitry (160) including logic that, when executed, causes the device (100) to perform operations including: adjusting the lens system to have a first focal configuration and a first aperture at which the first object is at a first near depth of field; determining a first count of any of the plurality of objects that are in focus while the first object is at the first near depth of field, due to the first focal configuration and the first aperture, where the first count represents a number of objects in focus while the first object is at the first near depth of field; adjusting the lens system to have a second focal configuration and the first aperture at which the second object is at a second near depth of field; determining a second count of any of the plurality of objects that are in focus while the second object is at the second near depth of field, due to the second focal configuration and the first aperture, where the second count represents a number of objects in focus while the second object is at the second near depth of field; comparing a first score based on the first count that results from the first focal configuration and a second score based on the second count that results from the second focal configuration, to identify a preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; and providing, based on the comparison of the first score with the second score, a signal (162) indicating the preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; a focus controller (170) coupled to adjust the lens system (110) based on the signal (162); and an image sensor (310) optically coupled to capture an image received via the lens system (110) after the lens system has been adjusted based on the signal.
[0002]
2. Device according to claim 1, characterized in that the selection circuitry (160) is configured to cause the device to generate the signal (162) indicating the preference between the first focal configuration and the second focal configuration independently of the device performing any in-focus object count while the entire plurality of objects is offset from a near depth of field of a given focal configuration.
[0003]
3. Device according to claim 1, characterized in that the selection circuitry (160) is configured to cause the device to determine one or more other foci of the lens system (110), each of the one or more other foci being due in part to a respective aperture different from the first aperture.
[0004]
4. Device according to claim 1, characterized in that the signal indicates the preference for the first focal configuration, and an object other than the first object, of the plurality of objects, is the object closest to the lens system while the lens system has the first focal configuration.
[0005]
5. Device according to claim 1, characterized in that the selection circuit is configured to cause the device to calculate the first score based on a first weight value assigned to an object based on a location of the object in the field of view (320).
[0006]
6. Device according to claim 5, characterized in that the location of the object in the field of view (320) is relative to a reference point or a reference line of the field of view.
[0007]
7. Device according to claim 6, characterized in that the reference point is a center of the field of view (320).
[0008]
8. Device according to claim 5, characterized in that the location of the object in the field of view (320) is relative to another object of the plurality of objects.
[0009]
9. Device according to claim 5, characterized in that the first score includes a number other than an integer.
[0010]
10. Device according to claim 1, characterized in that the selection circuitry is configured to cause the device to calculate the first score based on a first weight value assigned to an object based on an object type of the object.
[0011]
11. Device according to claim 10, characterized in that the object type includes a human-face object type.
[0012]
12. Non-transitory computer-readable storage medium characterized in that it comprises stored instructions which, when executed by one or more processing units, cause the one or more processing units to perform a method comprising: identifying respective distances to each of a plurality of objects in a field of view (320) of a lens system (110), the plurality of objects including a first object and a second object; determining a focus of the lens system (110), including: adjusting the lens system to have a first focal configuration and a first aperture at which the first object is at a first near depth of field; determining a first count of any of the plurality of objects that are in focus while the first object is at the first near depth of field, due to the first focal configuration and the first aperture, where the first count represents a number of objects in focus while the first object is at the first near depth of field; adjusting the lens system to have a second focal configuration and the first aperture at which the second object is at a second near depth of field; determining a second count of any of the plurality of objects that are in focus while the second object is at the second near depth of field, due to the second focal configuration and the first aperture, where the second count represents a number of objects in focus while the second object is at the second near depth of field; comparing a first score based on the first count that results from the first focal configuration and a second score based on the second count that results from the second focal configuration, to identify a preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; and providing, based on the comparison of the first score with the second score, a signal (162) indicating the preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; adjusting the lens system (110) based on the signal (162); and capturing an image received via the lens system (110) after the lens system has been adjusted based on the signal.
[0013]
13. Non-transitory computer-readable storage medium according to claim 12, characterized in that the signal (162) indicating the preference between the first focal configuration and the second focal configuration is generated independently of performing any count of objects in focus while the entire plurality of objects is offset from a near depth of field of a given focal configuration.
[0014]
14. Non-transitory computer-readable storage medium according to claim 12, characterized in that the method further comprises determining one or more other foci of the lens system (110), each of the one or more other foci being due in part to a respective aperture different from the first aperture.
[0015]
15. Non-transitory computer-readable storage medium according to claim 12, characterized in that the signal indicates the preference for the first focal configuration, and an object other than the first object of the plurality of objects is the object closest to the lens system while the lens system has the first focal configuration.
[0016]
16. A computer-readable, non-transient storage medium according to claim 12, characterized in that the method further includes calculating the first score based on a first weight value assigned to an object based on a location of the object in the field of view (320).
[0017]
17. Non-transitory computer-readable storage medium according to claim 16, characterized in that the location of the object in the field of view (320) is relative to a reference point or a reference line of the field of view.
[0018]
18. Computer-readable, non-transient storage medium according to claim 16, characterized in that the location of the object in the field of view (320) is relative to another object of the plurality of objects.
[0019]
19. Method, characterized in that it comprises: identifying respective distances for each of a plurality of objects in a field of view (320) of a lens system (110), the plurality of objects including a first object and a second object; determining a focus of the lens system (110), including: adjusting the lens system to have a first focal configuration and a first aperture at which the first object is at a first near depth of field; determining a first count of any of the plurality of objects that are in focus while the first object is at the first near depth of field, due to the first focal configuration and the first aperture, where the first count represents a number of objects in focus while the first object is at the first near depth of field; adjusting the lens system to have a second focal configuration and the first aperture at which the second object is at a second near depth of field; determining a second count of any of the plurality of objects that are in focus while the second object is at the second near depth of field, due to the second focal configuration and the first aperture, where the second count represents a number of objects in focus while the second object is at the second near depth of field; comparing a first score based on the first count that results from the first focal configuration and a second score based on the second count that results from the second focal configuration, to identify a preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; and providing, based on the comparison between the first score and the second score, a signal (162) indicating the preference between the first focal configuration and the second focal configuration based on which focal configuration provides a greater number of objects in focus; adjusting the lens system (110) based on the signal (162); and capturing an image received via the lens system (110) after the lens system has been adjusted based on the signal.
[0020]
20. Method according to claim 19, characterized in that the signal (162) indicating the preference between the first focal configuration and the second focal configuration is generated independently of performing any count of objects in focus while the entire plurality of objects is offset from a near depth of field of a given focal configuration.
[0021]
21. Method according to claim 19, characterized in that the signal indicates the preference for the first focal configuration, and an object other than the first object, of the plurality of objects, is the object closest to the lens system while the lens system has the first focal configuration.
[0022]
22. Method according to claim 19, characterized in that it further includes calculating the first score based on a first weight value assigned to an object based on a location of the object in the field of view (320).
类似技术:
公开号 | 公开日 | 专利标题
BR112019009917B1|2021-08-31|DEVICE, STORAGE MEANS AND METHOD TO PROVIDE AN AUTOMATIC FOCUS CAPACITY BASED ON OBJECT DISTANCE INFORMATION
CN105659580B|2018-09-07|A kind of Atomatic focusing method, device and electronic equipment
US10264176B2|2019-04-16|Gaze tracking device and method and recording medium for performing the same
JP2014057303A|2014-03-27|System and method for utilizing enhanced scene detection in depth estimation procedure
CN108401457A|2018-08-14|A kind of control method of exposure, device and unmanned plane
WO2017143745A1|2017-08-31|Method and apparatus for determining movement information of to-be-detected object
US10565461B2|2020-02-18|Live facial recognition method and system
US11003939B2|2021-05-11|Information processing apparatus, information processing method, and storage medium
US20160093046A1|2016-03-31|Apparatus and method for supporting computer aided diagnosis
JP5776323B2|2015-09-09|Corneal reflection determination program, corneal reflection determination device, and corneal reflection determination method
US20200226789A1|2020-07-16|Camera calibration plate, camera calibration method and device, and image acquisition system
JP2017049426A|2017-03-09|Phase difference estimation device, phase difference estimation method, and phase difference estimation program
US20170098139A1|2017-04-06|Method and a system for identifying reflective surfaces in a scene
TWI687689B|2020-03-11|Measurement device and measurement method for rotation of round body and non-transitory information readable medium
CN105763805B|2019-03-08|Control method, control device and electronic device
US10692225B2|2020-06-23|System and method for detecting moving object in an image
JP6346018B2|2018-06-20|Eye measurement system, eye detection system, eye measurement method, eye measurement program, eye detection method, and eye detection program
US10354164B2|2019-07-16|Method for detecting glint
EP3279864B1|2019-03-06|A method for obtaining parameters defining a pixel beam associated with a pixel of an image sensor comprised in an optical device
Tordoff2002|Active control of zoom for computer vision
US8655162B2|2014-02-18|Lens position based on focus scores of objects
KR101827114B1|2018-03-22|Apparatus and method for detecting proximal entity in pen
JP2009059165A|2009-03-19|Outline detection apparatus, sight line detection apparatus using the same, program for causing computer to remove false outline data, program for causing computer to detect sight line direction, and computer-readable recording medium with the program recorded
US9721151B2|2017-08-01|Method and apparatus for detecting interfacing region in depth image
CN106101542A|2016-11-09|Image processing method and terminal
Patent family:
Publication number | Publication date
WO2018093423A1|2018-05-24|
GB201715163D0|2017-11-01|
KR20190085964A|2019-07-19|
DE202017105587U1|2018-03-19|
JP2020502559A|2020-01-23|
US20180139376A1|2018-05-17|
CN108076268B|2020-04-10|
CN108076268A|2018-05-25|
KR102138845B1|2020-07-28|
GB2555942B|2020-01-15|
US10027879B2|2018-07-17|
JP6960455B2|2021-11-05|
GB2555942A|2018-05-16|
DE102017121395A1|2018-05-17|
BR112019009917A2|2020-01-14|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

DE3437145C2|1984-10-10|1996-03-28|Ricoh Kk|Automatic focusing device for photographic cameras|
GB2196134B|1986-09-30|1990-12-19|Canon Kk|Camera having auto focus apparatus|
JPH02254432A|1989-03-29|1990-10-15|Canon Inc|Automatic focusing camera|
DE4104113C2|1990-02-14|1996-05-23|Asahi Optical Co Ltd|Camera with a motorized zoom lens|
US5532782A|1994-02-17|1996-07-02|Nikon Corporation|Camera having a depth priority operation mode|
JPH085915A|1994-06-15|1996-01-12|Minolta Co Ltd|Fixed focus type zoom lens|
US7099575B2|2002-07-08|2006-08-29|Fuji Photo Film Co., Ltd.|Manual focus device and autofocus camera|
JP4794963B2|2005-06-28|2011-10-19|キヤノン株式会社|Imaging apparatus and imaging program|
RU2384968C1|2005-12-06|2010-03-20|Панасоник Корпорэйшн|Digital camera|
US7646977B2|2007-01-17|2010-01-12|Gateway, Inc.|Depth of field bracketing|
US20090015681A1|2007-07-12|2009-01-15|Sony Ericsson Mobile Communications Ab|Multipoint autofocus for adjusting depth of field|
US20110002680A1|2009-07-02|2011-01-06|Texas Instruments Incorporated|Method and apparatus for focusing an image of an imaging device|
US8229172B2|2009-12-16|2012-07-24|Sony Corporation|Algorithms for estimating precise and relative object distances in a scene|
US8655162B2|2012-03-30|2014-02-18|Hewlett-Packard Development Company, L.P.|Lens position based on focus scores of objects|
US9065993B1|2012-07-31|2015-06-23|Google Inc.|Fixed focus camera with lateral sharpness transfer|
JP2014235224A|2013-05-31|2014-12-15|株式会社ニコン|Imaging device and control program|
US9477138B2|2013-06-10|2016-10-25|Apple Inc.|Autofocus|
JP6018029B2|2013-09-26|2016-11-02|富士フイルム株式会社|Apparatus for determining main face image of captured image, control method thereof and control program thereof|
TWI522720B|2013-10-29|2016-02-21|Nat Univ Chung Cheng|Focus adjustment method|
CN105827928A|2015-01-05|2016-08-03|Focusing area selection method and focusing area selection device|
CN108663803B|2017-03-30|2021-03-26|腾讯科技(深圳)有限公司|Virtual reality glasses, lens barrel adjusting method and device|
CN112333378A|2018-07-23|2021-02-05|深圳市真迈生物科技有限公司|Imaging method, device and system|
CN112291469A|2018-07-23|2021-01-29|深圳市真迈生物科技有限公司|Imaging method, device and system|
US11153483B2|2018-08-24|2021-10-19|Bossa Nova Robotics Ip, Inc.|Shelf-viewing camera with multiple focus depths|
US10554918B1|2018-12-06|2020-02-04|Alibaba Group Holding Limited|High definition, large capture volume, camera array system|
CN110225244B|2019-05-15|2021-02-09|华为技术有限公司|Image shooting method and electronic equipment|
CN112073601B|2019-06-11|2022-02-01|杭州海康微影传感科技有限公司|Focusing method, device, imaging equipment and computer readable storage medium|
WO2021118279A1|2019-12-11|2021-06-17|Samsung Electronics Co., Ltd.|Electronic apparatus and method for controlling thereof|
US20210326064A1|2020-04-15|2021-10-21|Micron Technology, Inc.|Media type selection based on object distance|
CN111521119B|2020-04-24|2021-07-23|北京科技大学|Casting blank thickness self-adaptive online image acquisition method|
Legal status:
2021-06-29| B15K| Others concerning applications: alteration of classification|Free format text: THE PREVIOUS CLASSIFICATION WAS: H04N 5/232 Ipc: H04N 5/232 (2006.01), G02B 7/28 (2021.01), G03B 13 |
2021-07-06| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2021-08-31| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 07/20/2017, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
US15/352,503|US10027879B2|2016-11-15|2016-11-15|Device, system and method to provide an auto-focus capability based on object distance information|
US15/352,503|2016-11-15|
PCT/US2017/043017|WO2018093423A1|2016-11-15|2017-07-20|Device, system and method to provide an auto-focus capability based on object distance information|